# Polish text generation
## Bielik 1.5B V3.0 Instruct FP8 Dynamic
**License:** Apache-2.0 · **Author:** speakleash · **Downloads:** 31 · **Likes:** 1
**Tags:** Large Language Model, Other

An FP8 dynamic quantization of the Bielik-1.5B-v3.0-Instruct model, adapted for the vLLM and SGLang inference frameworks. It uses AutoFP8 quantization to reduce weight precision from 16-bit to 8-bit, significantly lowering disk-space and GPU VRAM requirements.
## Bielik 1.5B V3.0 Instruct GGUF
**License:** Apache-2.0 · **Author:** speakleash · **Downloads:** 341 · **Likes:** 3
**Tags:** Large Language Model, Other

A 1.5B-parameter instruction fine-tuned model for Polish, built on the SpeakLeash Bielik series and suitable for text generation tasks.
## Bielik 11B V2.3 Instruct GGUF
**License:** Apache-2.0 · **Author:** speakleash · **Downloads:** 2,203 · **Likes:** 29
**Tags:** Large Language Model, Transformers

The GGUF quantized version of the Polish large language model Bielik-11B-v2.3-Instruct developed by SpeakLeash, suitable for local deployment and use.
## Curie 7B V1
**License:** Apache-2.0 · **Author:** szymonrucinski · **Downloads:** 26 · **Likes:** 5
**Tags:** Large Language Model, Transformers, Other

Curie-7B-v1 is fine-tuned from an English large language model (LLM) for Polish, excelling at Polish text generation and a range of other NLP tasks.
## Plt5 Large
**Author:** allegro · **Downloads:** 1,366 · **Likes:** 5
**Tags:** Large Language Model, Transformers, Other

plT5 is a language model based on the T5 architecture, trained specifically on a Polish corpus and optimized for the original T5 denoising objective.
## Papugapt2
**Author:** flax-community · **Downloads:** 804 · **Likes:** 11
**Tags:** Large Language Model, Other

A Polish text generation model based on the GPT-2 architecture, trained on the multilingual OSCAR corpus to fill a gap in the Polish NLP field.
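As a GPT-2-style causal model, it plugs into the standard transformers text-generation pipeline. A minimal sketch, assuming the hub id `flax-community/papuGaPT2` (verify on the model page):

```python
# transformers pipeline sketch; the hub id is an assumption.
from transformers import pipeline

generator = pipeline("text-generation", model="flax-community/papuGaPT2")

# The pipeline returns the prompt plus the generated continuation.
result = generator("Najpiękniejszym polskim miastem jest", max_new_tokens=40)
print(result[0]["generated_text"])
```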